Potentials for Denumerable Markov Chains
Authors
Abstract
Probabilistic generalizations of classical potential theory have been worked out by J. L. Doob [1] and G. A. Hunt [2]. Specialized to discrete processes, this provides a potential theory for transient Markov chains, analogous to the theory of the Newtonian potential. We provide a unified potential theory for all denumerable Markov chains; as applied to recurrent chains, the theory generalizes classical results on logarithmic potentials. We denote by P the transition matrix of a Markov chain with states the integers.

DEFINITION. A function of states f is called a right potential charge if it is integrable with respect to a given non-negative superregular measure α (αP ≤ α), and if the limit g = lim_n (I + P + ··· + P^n)f exists. Then g is called a right potential with charge f. If μ is a finite measure and ν = lim_n μ(I + P + ··· + P^n) exists, then ν is called a left potential with charge μ. The mapping f → μ, with μ_i = f_i α_i, is an isomorphism preserving all the interesting properties. E.g., f is a right charge for P if and only if μ is a left charge for the so-called reverse chain. Since the reverse chains include all Markov chains, we automatically obtain for each theorem about functions a dual theorem about measures.

In the transient case potentials can be represented by means of a positive potential operator G = I + P + P^2 + ···, as g = Gf or ν = μG. In the recurrent case we show that αf = 0 is a necessary condition, or dually, μ must have total measure 0. In this case we have dual positive operators ...
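As a concrete illustration of the transient case, the following is a minimal numerical sketch (Python with NumPy) that approximates the potential operator G = I + P + P^2 + ··· by partial sums and checks that g = Gf agrees with g = lim_n (I + P + ··· + P^n)f. The particular chain (a simple random walk killed at the boundary), the charge f, and the truncation level are illustrative assumptions, not taken from the paper, which works on a general denumerable state space.

```python
import numpy as np

# Simple random walk on {0, 1, ..., 6}, killed at the boundary states 0 and 6.
# Restricted to the transient interior states {1, ..., 5}, the transition
# matrix Q is strictly substochastic, so G = I + Q + Q^2 + ... converges.
N = 6
interior = list(range(1, N))               # transient states 1..5 (illustrative choice)
Q = np.zeros((N - 1, N - 1))
for idx, i in enumerate(interior):
    if i - 1 >= 1:
        Q[idx, idx - 1] = 0.5              # step left stays inside
    if i + 1 <= N - 1:
        Q[idx, idx + 1] = 0.5              # step right stays inside
    # a step onto 0 or 6 leaves the transient part: the "missing" row mass

f = np.array([1.0, 0.0, 2.0, 0.0, 1.0])    # an illustrative charge on the interior states

# Approximate g = lim_n (I + Q + ... + Q^n) f by partial sums.
g_partial = np.zeros_like(f)
term = f.copy()                            # holds Q^n f
for _ in range(2000):
    g_partial += term
    term = Q @ term

# For a transient block the potential operator is G = (I - Q)^(-1),
# so g = G f should match the partial-sum limit.
G = np.linalg.inv(np.eye(N - 1) - Q)
g_exact = G @ f

print(np.allclose(g_partial, g_exact))     # expected: True
```

The same computation read row-wise against a finite measure μ gives the left potential ν = μG; in the recurrent case the series does not converge, which is why the paper imposes the side condition αf = 0 (dually, total measure 0 for μ).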
Similar resources
On Weak Lumpability of Denumerable Markov Chains
We consider weak lumpability of denumerable discrete or continuous time Markov chains. Firstly, we are concerned with irreducible recurrent positive and R-positive Markov chains evolving in discrete time. We study the properties of the set of all initial distributions of the starting chain leading to an aggregated homogeneous Markov chain with respect to a partition of the state space. In parti...
Monotonicity and Convexity of Some Functions Associated with Denumerable Markov Chains and Their Applications to Queueing Systems
Motivated by various applications in queueing theory, this paper is devoted to the monotonicity and convexity of some functions associated with discrete-time or continuous-time denumerable Markov chains. For the discrete-time case, conditions for the monotonicity and convexity of the functions are obtained by using the properties of stochastic dominance and monotone matrix. For the continuous-t...
A note on exponential stability of the nonlinear filter for denumerable Markov chains
We study asymptotic stability of the optimal filter with respect to its initial conditions. We show that exponential stability of the nonlinear filter holds for a large class of denumerable Markov chains, including all finite Markov chains, under the assumption that the observation function is one-to-one and the observation noise is sufficiently small. Ergodicity of the signal process is not as...
On Markov chains induced by partitioned transition probability matrices (arXiv:0907.4502v1 [math.PR], 26 Jul 2009)
Let S be a denumerable state space and let P be a transition probability matrix on S. If a denumerable set M of nonnegative matrices is such that the sum of the matrices is equal to P , then we call M a partition of P . Let K denote the set of probability vectors on S. To every partition M of P we can associate a transition probability function PM on K defined in such a way that if p ∈ K and M ...
Perturbation Analysis for Denumerable Markov Chains with Application to Queueing Models
We study the parametric perturbation of Markov chains with denumerable state spaces. We consider both regular and singular perturbations. By the latter we mean that transition probabilities of a Markov chain, with several ergodic classes, are perturbed such that (rare) transitions among the different ergodic classes of the unperturbed chain are allowed. Singularly perturbed Markov chains have bee...
Publication date: 2007